2025-03-10 17:59:04 · AIbase
ByteDance Open-Sources COMET: A Technology Boosting Large Model Training Efficiency by 1.7x
ByteDance's Doubao large model team recently announced a breakthrough addressing a key bottleneck in the Mixture-of-Experts (MoE) architecture, open-sourcing an optimization technology called COMET. COMET substantially improves large model training efficiency, delivering a 1.7x speedup and a 40% reduction in training costs. It has already been deployed in ByteDance's multi-thousand-GPU training clusters, saving millions of GPU hours.
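For context on why MoE training has a communication bottleneck for COMET to optimize: in an MoE layer, each token is routed to a few selected experts, and since experts are sharded across GPUs, the dispatch and combine steps become all-to-all communication that can stall compute. The sketch below is a minimal single-process illustration of top-k expert routing (names and shapes are illustrative assumptions, not ByteDance's implementation; the distributed communication is only indicated in comments):

```python
import numpy as np

def moe_forward(x, gate_w, expert_ws, top_k=2):
    """Minimal Mixture-of-Experts layer sketch.

    x:         (tokens, d_in) input activations
    gate_w:    (d_in, n_experts) router weights
    expert_ws: list of (d_in, d_out) per-expert weights
    """
    logits = x @ gate_w                              # (tokens, n_experts)
    top = np.argsort(logits, axis=1)[:, -top_k:]     # top-k expert ids per token
    # Softmax over only the selected experts' logits.
    sel = np.take_along_axis(logits, top, axis=1)
    w = np.exp(sel - sel.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)

    out = np.zeros((x.shape[0], expert_ws[0].shape[1]))
    for t in range(x.shape[0]):
        for k in range(top_k):
            e = top[t, k]
            # In distributed training this dispatch (and the combine below)
            # is an all-to-all across GPUs -- the communication cost that
            # overlap techniques like COMET aim to hide behind computation.
            out[t] += w[t, k] * (x[t] @ expert_ws[e])
    return out
```

Because each token touches only `top_k` of `n_experts` weight matrices, compute stays roughly constant as experts are added, but the cross-GPU token shuffling grows, which is why overlapping that communication with computation pays off.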